mg2vec: Learning Relationship-Preserving Heterogeneous Graph Representations via Metagraph Embedding

Authors

Abstract


Related Articles

Learning Graph Representations with Embedding Propagation

Label Representations
• Let l ∈ R^d be the representation of label l, and f be a differentiable embedding function.
• For labels of label type i, we apply a learnable embedding function l = f_i(l).
• h_i(v) is the embedding of label type i for vertex v: h_i(v) = g_i({l | l ∈ labels of type i associated with vertex v}).
• h̃_i(v) is the reconstruction of the embedding of label type i for vertex v: h̃_i(v) ...
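The bullets above define a per-type vertex embedding h_i(v) as an aggregation g_i over the embeddings of the labels attached to v, plus a reconstruction h̃_i(v) built from neighbouring vertices. A minimal sketch, assuming mean pooling for both g_i and the neighbour aggregation (both are illustrative assumptions, as are the toy labels; the method admits other aggregators):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4  # embedding dimension

# Hypothetical learnable label embeddings for one label type, l = f_i(l)
label_emb = {
    "graph": rng.normal(size=d),
    "embedding": rng.normal(size=d),
    "neural": rng.normal(size=d),
}

def h_i(labels):
    """g_i over the label embeddings of one label type: mean pooling here."""
    return np.mean([label_emb[l] for l in labels], axis=0)

# Embedding of label type i for vertex v, which carries two labels
hv = h_i(["graph", "embedding"])

# Reconstruction h̃_i(v): aggregated from the neighbours' label-type
# embeddings, again by mean pooling in this sketch
neighbour_hs = [h_i(["neural"]), h_i(["graph", "neural"])]
hv_tilde = np.mean(neighbour_hs, axis=0)

# A reconstruction objective would pull h̃_i(v) towards h_i(v);
# squared error shown purely for illustration
loss = float(np.sum((hv - hv_tilde) ** 2))
print(hv.shape, loss)
```

Training would backpropagate such a loss into the label embeddings so that a vertex's own labels and its neighbourhood agree.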


Graph attribute embedding via Riemannian submersion learning

In this paper, we tackle the problem of embedding a set of relational structures into a metric space for purposes of matching and categorisation. To this end, we view the problem from a Riemannian perspective and make use of the concepts of charts on the manifold to define the embedding as a mixture of class-specific submersions. Formulated in this manner, the mixture weights are recovered usin...


Graph Embedding aided Relationship Prediction in Heterogeneous Networks

We consider the problem of predicting relationships in large-scale heterogeneous networks. For example, one can try to predict if a researcher will publish at a conference (e.g., VLDB) given her previous publications, or try to anticipate whether two reputed researchers working in the same area will collaborate. The main challenge is to extract latent information from such real-world networks, which are...


Embedding Heterogeneous Data by Preserving Multiple Kernels

Heterogeneous data may arise in many real-life applications under different scenarios. In this paper, we formulate a general framework to address the problem of modeling heterogeneous data. Our main contribution is a novel embedding method, called multiple kernel preserving embedding (MKPE), which projects heterogeneous data into a unified embedding space by preserving cross-domain interactions ...


Word Representations via Gaussian Embedding

Current work in lexical distributed representations maps each word to a point vector in low-dimensional space. Mapping instead to a density provides many interesting advantages, including better capturing uncertainty about a representation and its relationships, expressing asymmetries more naturally than dot product or cosine similarity, and enabling more expressive parameterization of decision...



Journal

Journal title: IEEE Transactions on Knowledge and Data Engineering

Year: 2020

ISSN: 1041-4347,1558-2191,2326-3865

DOI: 10.1109/tkde.2020.2992500